
Machine Learning Sensors for Diagnosis of COVID-19 Disease Using Routine Blood Values for Internet of Things Application

Velichko, Andrei, Huyut, Mehmet Tahir, Belyaev, Maksim, Izotov, Yuriy, Korzun, Dmitry

arXiv.org Artificial Intelligence

Healthcare digitalization requires effective applications of human sensors, in which various parameters of the human body are monitored continuously in everyday life via the Internet of Things (IoT). In particular, machine learning (ML) sensors for the prompt diagnosis of COVID-19 are an important option for IoT applications in healthcare and ambient assisted living (AAL). Determining COVID-19 infection status with various diagnostic tests and imaging results is costly and time-consuming. This study provides a fast, reliable, and cost-effective alternative tool for the diagnosis of COVID-19 based on the routine blood values (RBVs) measured at admission. The dataset of the study consists of a total of 5296 patients, with equal numbers of negative and positive COVID-19 test results, and 51 routine blood values. In this study, 13 popular machine learning classifier models and the LogNNet neural network model were examined. The most successful classifier model in terms of time and accuracy in the detection of the disease was the histogram-based gradient boosting (HGB) classifier (accuracy: 100%, time: 6.39 sec). The HGB classifier identified the 11 most important features (LDL, cholesterol, HDL-C, MCHC, triglyceride, amylase, UA, LDH, CK-MB, ALP and MCH) to detect the disease with 100% accuracy. In addition, the importance of single, double, and triple combinations of these features in the diagnosis of the disease was discussed. We propose to use these 11 features and their binary combinations as important biomarkers for ML sensors in the diagnosis of the disease, supporting edge computing on Arduino and cloud IoT services.


The emerging world of ML sensors

#artificialintelligence

Today, we live in the era of AI scaling. It seems like everywhere you look, people are pushing to make large language models larger or more multi-modal, leveraging ungodly amounts of processing power to do it. At the opposite extreme from the world of hyperscale transformers and giant dense nets is the fast-evolving world of TinyML, where the goal is to pack AI systems onto small edge devices. My guest today is Matthew Stewart, a deep learning and TinyML researcher at Harvard University, where he collaborates with the world's leading IoT and TinyML experts on projects aimed at getting small devices to do big things with AI. Recently, along with his colleagues, Matt co-authored a paper that introduced a new way of thinking about sensing.


Machine Learning Sensors: Truly Data-Centric AI

#artificialintelligence

"Paradoxically, data is the most under-valued and de-glamorised aspect of AI" -- Google research authors of "Data Cascades in High-Stakes AI." "Data is food for AI" -- Andrew Ng, Stanford professor and pioneer of the data-centric AI philosophy. Machine learning has seen a bifurcation towards both smaller and larger models in recent years. Large-scale language models with hundreds of billions of parameters are being released regularly, and, with no signs of performance saturation, we can expect this trend to continue. On the flip side, the field of tiny machine learning (TinyML) -- deploying machine learning models on resource-constrained microcontrollers -- is also starting to take hold. Commercial applications for TinyML already exist, from keyword spotting in smartphones (e.g., "Hey Siri" and "OK Google") to person detection for controlling intelligent lighting, HVAC, and security systems.